Inexact descent methods for elastic parameter optimization

Authors
Abstract


Similar articles

Descent Methods for Tuning Parameter Refinement

This paper addresses multidimensional tuning-parameter selection in the context of “train-validate-test” and K-fold cross-validation. A coarse grid search over the tuning-parameter space is used to initialize a descent method, which then jointly optimizes over variables and tuning parameters. We study four regularized regression methods and develop the update equations for the corresponding descent ...
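
A minimal sketch of this grid-then-descent pattern, using ridge regression on a synthetic train/validation split. For brevity the inner coefficients are recovered in closed form rather than updated jointly with the tuning parameter, and the grid, step rule, and finite-difference gradient are illustrative assumptions, not the paper's update equations:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((80, 10))
    y = X @ rng.standard_normal(10) + 0.5 * rng.standard_normal(80)
    Xtr, ytr, Xva, yva = X[:60], y[:60], X[60:], y[60:]

    def ridge_fit(lam):
        # Closed-form ridge coefficients on the training split.
        return np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(10), Xtr.T @ ytr)

    def val_loss(lam):
        # Validation loss as a function of the tuning parameter alone.
        r = Xva @ ridge_fit(lam) - yva
        return 0.5 * r @ r

    # 1) Coarse grid search initializes the tuning parameter.
    t = np.log(min(np.logspace(-3, 3, 7), key=val_loss))

    # 2) Descent refinement in log(lambda): finite-difference gradient
    #    plus backtracking on the validation loss.
    for _ in range(40):
        h = 1e-4
        g = (val_loss(np.exp(t + h)) - val_loss(np.exp(t - h))) / (2 * h)
        step = 1.0
        while step > 1e-10 and val_loss(np.exp(t - step * g)) > val_loss(np.exp(t)):
            step *= 0.5
        t -= step * g
    print("refined lambda:", np.exp(t))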


Descent methods for optimization on homogeneous manifolds

Preprint Numerics No. 1/2007, Norwegian University of Science and Technology, Trondheim, Norway. In this article we present a framework for line-search methods for optimization on smooth homogeneous manifolds, with particular emphasis on the Lie group of real orthogonal matrices. We propose strategies based on univariate descent (UVD) methods. The advantages of this approach are that the optimization ...
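
For intuition, here is a generic line-search descent on the orthogonal group, applied to the textbook problem of finding the orthogonal matrix nearest to a given A; the skew-symmetric direction and exponential-map update keep every iterate exactly orthogonal. This is a plain Riemannian gradient scheme, not the paper's UVD strategy:

    import numpy as np
    from scipy.linalg import expm

    rng = np.random.default_rng(1)
    A = rng.standard_normal((5, 5))

    def f(Q):
        # Squared distance to A, minimized over the orthogonal group.
        return 0.5 * np.linalg.norm(Q - A, "fro") ** 2

    Q = np.eye(5)
    for _ in range(200):
        G = Q - A                          # Euclidean gradient of f
        Om = 0.5 * (Q.T @ G - G.T @ Q)     # skew-symmetric tangent direction
        if np.linalg.norm(Om) < 1e-10:
            break
        # Armijo backtracking along Q @ expm(-tau * Om); the matrix
        # exponential of a skew-symmetric matrix is orthogonal, so the
        # iterate never leaves the group.
        tau = 1.0
        while f(Q @ expm(-tau * Om)) > f(Q) - 1e-4 * tau * np.sum(Om * Om):
            tau *= 0.5
        Q = Q @ expm(-tau * Om)
    print("final objective:", f(Q))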


Inexact Alternating Direction Methods of Multipliers for Separable Convex Optimization

Abstract. Inexact alternating direction methods of multipliers (ADMMs) are developed for solving general separable convex optimization problems with a linear constraint and an objective that is the sum of smooth and nonsmooth terms. The approach involves linearized subproblems, a back-substitution step, and either gradient or accelerated gradient techniques. Global convergence is established. ...
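
A sketch of one linearized (hence inexact) ADMM iteration on a small lasso instance, min 0.5*||Ax - b||^2 + mu*||z||_1 subject to x = z: the smooth subproblem is replaced by a single gradient step on the augmented Lagrangian, while the z-step keeps its exact soft-thresholding prox. The step size 1/(||A||^2 + rho) is a conservative illustrative choice; the back-substitution and accelerated variants from the paper are omitted:

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((40, 100))
    b = rng.standard_normal(40)
    mu, rho = 0.1, 1.0

    x = np.zeros(100); z = np.zeros(100); u = np.zeros(100)
    tau = 1.0 / (np.linalg.norm(A, 2) ** 2 + rho)  # safe step for the linearized x-step

    for _ in range(500):
        # Inexact x-step: one gradient step on the augmented Lagrangian
        # instead of solving the smooth subproblem exactly.
        x = x - tau * (A.T @ (A @ x - b) + rho * (x - z + u))
        # z-step: exact prox of mu*||.||_1 (soft thresholding).
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - mu / rho, 0.0)
        # Multiplier update on the scaled dual variable.
        u = u + x - z
    print("nonzeros in z:", np.count_nonzero(z))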


Adaptive Multilevel Inexact SQP Methods for PDE-Constrained Optimization

We present a class of inexact adaptive multilevel trust-region SQP methods for the efficient solution of optimization problems governed by nonlinear partial differential equations. The algorithm starts with a coarse discretization of the underlying optimization problem and provides, during the optimization process, (1) implementable criteria for an adaptive refinement strategy of the current discr...
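
The core idea, inner accuracy matched to discretization accuracy, can be mimicked on a toy unconstrained problem. In the sketch below, a quadrature grid is refined once the gradient norm drops below a crude two-level discretization-error estimate; the model problem, the estimator, and the plain gradient inner solver are all placeholder assumptions standing in for the paper's PDE-constrained trust-region SQP machinery:

    import numpy as np

    def J_and_grad(c, N):
        # Trapezoidal discretization of integral_0^1 (p_c(x) - exp(x))^2 dx,
        # where p_c is the quadratic polynomial with coefficients c.
        x = np.linspace(0.0, 1.0, N + 1)
        V = np.vander(x, 3, increasing=True)    # basis [1, x, x^2]
        r = V @ c - np.exp(x)
        w = np.full(N + 1, 1.0 / N)
        w[0] *= 0.5
        w[-1] *= 0.5
        return r @ (w * r), 2.0 * V.T @ (w * r)

    c = np.zeros(3)
    N = 4
    for level in range(6):
        J, g = J_and_grad(c, N)
        # Crude error estimator: compare against the next finer level.
        disc_err = abs(J_and_grad(c, 2 * N)[0] - J)
        # Solve only until the optimization error is commensurate with
        # the discretization error; tighter accuracy would be wasted.
        while np.linalg.norm(g) > max(np.sqrt(disc_err), 1e-8):
            c -= 0.1 * g
            J, g = J_and_grad(c, N)
        N *= 2                                  # adaptive refinement
    print("fitted coefficients:", c)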


Exact and Inexact Subsampled Newton Methods for Optimization

The paper studies the solution of stochastic optimization problems in which approximations to the gradient and Hessian are obtained through subsampling. We first consider Newton-like methods that employ these approximations and discuss how to coordinate the accuracy in the gradient and Hessian to yield a superlinear rate of convergence in expectation. The second part of the paper analyzes an in...
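
A hedged sketch of the subsampled idea for regularized logistic regression: gradient and Hessian are estimated on separate subsamples (the Hessian on a smaller one), and the Newton system is solved only approximately by conjugate gradients with a residual tolerance tied to the gradient norm. The sample sizes, the 1e-6 damping, and the forcing rule are illustrative, not the schedules the paper analyzes:

    import numpy as np

    rng = np.random.default_rng(3)
    N, d = 5000, 20
    X = rng.standard_normal((N, d))
    y = np.sign(X @ rng.standard_normal(d) + 0.3 * rng.standard_normal(N))

    def sigmoid(t):
        return 1.0 / (1.0 + np.exp(-t))

    w = np.zeros(d)
    for _ in range(20):
        Sg = rng.choice(N, 2000, replace=False)  # gradient subsample
        Sh = rng.choice(N, 500, replace=False)   # smaller Hessian subsample
        mg = sigmoid(-y[Sg] * (X[Sg] @ w))
        g = -(X[Sg] * (y[Sg] * mg)[:, None]).mean(axis=0)
        s = sigmoid(X[Sh] @ w)
        H = (X[Sh] * (s * (1 - s))[:, None]).T @ X[Sh] / len(Sh) + 1e-6 * np.eye(d)
        # Inexact Newton step: conjugate gradients on H p = g, stopped
        # once the residual is small relative to the gradient norm.
        p = np.zeros(d); r = g.copy(); q = r.copy()
        tol = min(0.5, np.linalg.norm(g) ** 0.5) * np.linalg.norm(g)
        while np.linalg.norm(r) > tol:
            Hq = H @ q
            a = (r @ r) / (q @ Hq)
            p += a * q
            r_new = r - a * Hq
            q = r_new + ((r_new @ r_new) / (r @ r)) * q
            r = r_new
        w -= p
    print("training accuracy:", np.mean(np.sign(X @ w) == y))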



Journal

Journal title: ACM Transactions on Graphics

Year: 2019

ISSN: 0730-0301, 1557-7368

DOI: 10.1145/3272127.3275021